Class of continuous level associative memory neural nets.

Author

  • R. J. Marks II

Abstract

A neural net capable of restoring continuous level library vectors from memory is considered. As with Hopfield's neural net content addressable memory, the vectors in the memory library are used to program the neural interconnects. Given a portion of one of the library vectors, the net extrapolates the remainder. Sufficient conditions for convergence are stated. Effects of processor inexactitude and net faults are discussed. A more efficient computational technique for performing the memory extrapolation (at the cost of fault tolerance) is derived. The special case of table lookup memories is addressed specifically.
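The restoration described above can be illustrated numerically. The sketch below is a hedged illustration of the general idea, not the paper's exact formulation: the library vectors are stored as columns of a matrix `F`, and the net's action is mimicked by alternating between projecting onto the span of the library and clamping the known components. The second routine is a direct least-squares variant in the spirit of the abstract's "more efficient computational technique" that sacrifices fault tolerance. The function names and the toy library are assumptions for demonstration only.

```python
import numpy as np

def extrapolate_iterative(F, partial, known, iters=2000):
    """Restore a full library vector from a known subset of its components.

    F       : n x m matrix whose columns are the continuous-level library vectors
    partial : values of the known components
    known   : indices of the known components
    """
    P = F @ np.linalg.pinv(F)       # orthogonal projector onto the library span
    x = np.zeros(F.shape[0])
    x[known] = partial
    for _ in range(iters):
        x = P @ x                   # project onto the library subspace
        x[known] = partial          # re-impose (clamp) the known components
    return x

def extrapolate_direct(F, partial, known):
    # Solve for the expansion coefficients using only the known rows, then
    # synthesize the full vector: cheaper, but with no iterative fault tolerance.
    coeffs, *_ = np.linalg.lstsq(F[known, :], partial, rcond=None)
    return F @ coeffs

# Toy library of two continuous-level vectors in R^4 (hypothetical data).
F = np.array([[1., 0.],
              [2., 1.],
              [0., 3.],
              [1., 1.]])
v = F[:, 0]                                   # a library vector, [1, 2, 0, 1]
restored = extrapolate_iterative(F, v[:2], [0, 1])  # given only v[0], v[1]
```

When the known components determine a unique vector in the library span, both routines converge to the same answer; the iterative form corresponds to running the net, the direct form to a one-shot computation.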


Similar Articles

A Fast Learning Algorithm for Deep Belief Nets

We show how to use "complementary priors" to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. Using complementary priors, we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory. The fast, greedy algorit...


Learning to update Auto-associative Memory in Recurrent Neural Networks for Improving Sequence Memorization

Learning to remember long sequences remains a challenging task for recurrent neural networks. Register memory and attention mechanisms have both been proposed to resolve the issue, but they either incur a high computational cost to keep the memory differentiable, or bias the RNN's representation learning toward encoding short local contexts rather than long sequences. Associative memory,...


The Hopfield Model with Multi-Level Neurons

Michael Fleisher (Department of Electrical Engineering, Technion Israel Institute of Technology, Haifa 32000, Israel). The Hopfield neural network model for associative memory is generalized. The generalization replaces two-state neurons by neurons taking a richer set of values. Two classes of neuron input-output relations are developed guaranteeing conv...
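The generalization described above can be sketched with one concrete choice of multi-level neuron (a hedged illustration of the idea, not Fleisher's exact input-output relations): each neuron quantizes its net input to the nearest allowed level. For a symmetric, zero-diagonal weight matrix and asynchronous updates, the energy E(s) = -s·Ws/2 + s·s/2 never increases under this rule, which guarantees convergence to a fixed point. All names and the level set are assumptions for demonstration.

```python
import numpy as np

LEVELS = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # richer value set than {-1, +1}

def quantize(h):
    # Nearest allowed level; equivalently the minimizer of s**2/2 - s*h over LEVELS.
    return LEVELS[np.argmin((LEVELS - h) ** 2)]

def energy(W, s):
    # Lyapunov function for this update rule (symmetric W, zero diagonal).
    return -0.5 * s @ W @ s + 0.5 * s @ s

def run(W, s, sweeps=50):
    s = s.copy()
    energies = [energy(W, s)]
    for _ in range(sweeps):
        for i in range(len(s)):          # asynchronous: one neuron at a time
            s[i] = quantize(W[i] @ s)
        energies.append(energy(W, s))
    return s, energies

# Random symmetric net with zero diagonal (hypothetical weights).
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 8))
W = (A + A.T) / 2
np.fill_diagonal(W, 0)
state, energies = run(W, rng.choice(LEVELS, size=8))
```

Each single-neuron update minimizes s**2/2 - s*h over the level set, so the energy change is never positive; this is the mechanism behind convergence guarantees of this kind.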


5. Recurrent Networks 5.1 Basic Concepts of Associative Memory — Hopfield Nets

So far we have been studying networks that only have forward connections and thus there have been no loops. Networks with loops — recurrent connections is the technical term used in the literature — have an internal dynamics. The networks that we have introduced so far have only retained information about the past in terms of their weights, which have been changed according to the learning rule...


Weakly pulse-coupled oscillators, FM interactions, synchronization, and oscillatory associative memory

We study pulse-coupled neural networks that satisfy only two assumptions: each isolated neuron fires periodically, and the neurons are weakly connected. Each such network can be transformed by a piece-wise continuous change of variables into a phase model, whose synchronization behavior and oscillatory associative properties are easier to analyze and understand. Using the phase model, we can pr...
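The phase-model reduction mentioned above can be illustrated with the standard Kuramoto phase equations, a common phase model for weakly coupled oscillators (a sketch of the general setting, not the paper's specific change of variables): each oscillator is reduced to a single phase, and sinusoidal coupling pulls phases together, so identical oscillators synchronize.

```python
import numpy as np

def kuramoto_step(theta, omega, K, dt):
    # One Euler step of d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
    n = len(theta)
    coupling = (K / n) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    return theta + dt * (omega + coupling)

def order_parameter(theta):
    # |r| = 1 means perfect phase synchrony, |r| ~ 0 means incoherence.
    return abs(np.exp(1j * theta).mean())

# Five identical oscillators from random initial phases (hypothetical parameters).
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, size=5)
omega = np.zeros(5)
for _ in range(4000):
    theta = kuramoto_step(theta, omega, K=1.0, dt=0.05)
```

After integration the order parameter approaches 1: the phases lock, which is the kind of synchronization behavior the phase model makes easy to analyze.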



Journal:
  • Applied Optics

Volume 26, Issue 10

Pages: -

Published: 1987